
Gatsby Computational Neuroscience Unit


Michael Jordan


Thursday 17th November 2016

Time: 2.00pm


Ground Floor Seminar Room

25 Howland Street, London, W1T 4JG


A Variational Perspective on Accelerated Methods in Optimization

Abstract:
Accelerated gradient methods play a central role in optimization, achieving
optimal rates in many settings. While many generalizations and extensions of
Nesterov's original acceleration method have been proposed, it is not yet
clear what the natural scope of the acceleration concept is. In this paper,
we study accelerated methods from a continuous-time perspective. We show
that there is a Lagrangian functional, which we call the Bregman Lagrangian,
that generates a large class of accelerated methods in continuous time,
including (but not limited to) accelerated gradient descent, its non-Euclidean
extension, and accelerated higher-order gradient methods. We show that the
continuous-time limits of all of these methods correspond to traveling the
same curve in spacetime at different speeds. From this perspective, Nesterov's
technique and many of its generalizations can be viewed as a systematic way
to go from the continuous-time curves generated by the Bregman Lagrangian
to a family of discrete-time accelerated algorithms. [Joint work with
Andre Wibisono and Ashia Wilson.]
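
For readers who want a concrete anchor before the talk: in the associated paper
(with Wibisono and Wilson), the Bregman Lagrangian takes a form along the lines of

    \mathcal{L}(X, V, t) = e^{\alpha_t + \gamma_t} \left( D_h(X + e^{-\alpha_t} V, X) - e^{\beta_t} f(X) \right),

where D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle is the Bregman
divergence of a convex distance-generating function h, f is the objective, and
\alpha_t, \beta_t, \gamma_t are time-dependent scaling functions whose admissible
choices pick out different accelerated dynamics. Exact parameter conventions may
differ from the paper; consult it for the precise statement.

The canonical discrete-time instance of this family is Nesterov's accelerated
gradient descent. The sketch below is the standard textbook recursion, not the
paper's specific Bregman-Lagrangian discretization; the quadratic objective and
step-size choice are purely illustrative.

import numpy as np

def nesterov_agd(grad_f, x0, step, n_iters):
    # Standard Nesterov recursion: a gradient step taken from an
    # extrapolated point, followed by a momentum update whose weight
    # (k - 1) / (k + 2) grows toward 1 as the iteration count k increases.
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(1, n_iters + 1):
        x_next = y - step * grad_f(y)
        y = x_next + (k - 1) / (k + 2) * (x_next - x)
        x = x_next
    return x

# Illustrative use: minimize f(x) = 0.5 * ||A x - b||^2 with step 1/L,
# where L is the largest eigenvalue of A^T A (the gradient's Lipschitz constant).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)
x_min = nesterov_agd(grad, np.zeros(2), step=1.0 / L, n_iters=500)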

Bio:
Michael I. Jordan is the Pehong Chen Distinguished Professor in the
Department of Electrical Engineering and Computer Sciences and the
Department of Statistics at the University of California, Berkeley.
He received his Master's in Mathematics from Arizona State University
and earned his PhD in Cognitive Science in 1985 from the University of
California, San Diego. He was a professor at MIT from 1988 to 1998.
His research interests bridge the computational, statistical, cognitive
and biological sciences, and have focused in recent years on Bayesian
nonparametric analysis, probabilistic graphical models, spectral
methods, kernel machines and applications to problems in distributed computing
systems, natural language processing, signal processing and statistical
genetics. Prof. Jordan is a member of the National Academy
of Sciences, a member of the National Academy of Engineering and a
member of the American Academy of Arts and Sciences. He is a
Fellow of the American Association for the Advancement of Science.
He has been named a Neyman Lecturer and a Medallion Lecturer by the
Institute of Mathematical Statistics. He received the IJCAI Research
Excellence Award in 2016, the David E. Rumelhart Prize in 2015 and
the ACM/AAAI Allen Newell Award in 2009. He is a Fellow of the AAAI,
ACM, ASA, CSS, IEEE, IMS, ISBA and SIAM.